Targeted principal components regression

Authors

Abstract

We propose a principal components regression method based on maximizing a joint pseudo-likelihood for responses and predictors. Our method uses both responses and predictors to select linear combinations of the predictors relevant for the regression, thereby addressing an oft-cited deficiency of conventional principal components regression. The proposed estimator is shown to be consistent in a wide range of settings, including ones with non-normal and dependent observations; conditions on first and second moments suffice if the number of predictors (p) is fixed, the number of observations (n) tends to infinity, and dependence is weak, while stronger distributional assumptions are needed when p grows with n. We obtain the estimator's asymptotic distribution as the projection of a multivariate normal random vector onto a tangent cone of the parameter set at the true parameter, and find the estimator is asymptotically more efficient than competing ones. In simulations our method is substantially more accurate than conventional principal components regression and compares favorably with partial least squares and predictor envelopes. The method's practical usefulness is illustrated in a data example with cross-sectional prediction of stock returns.
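For context, a minimal numpy sketch of conventional principal components regression, the baseline the abstract contrasts with: it chooses components from the predictor covariance alone, ignoring the response, which is the deficiency the targeted method addresses. This is an illustrative sketch, not the authors' estimator; the function name and simulated data are invented for the example.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Conventional PCR: project predictors onto their top-k principal
    components (chosen without looking at y), then regress y on the scores."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Principal directions come from the predictors alone
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T                                   # p x k loadings
    Z = Xc @ W                                     # n x k component scores
    gamma = np.linalg.lstsq(Z, yc, rcond=None)[0]  # regress y on scores
    beta = W @ gamma                               # map back to predictor scale
    return beta, y.mean() - X.mean(axis=0) @ beta

# Simulated example: the signal lies along the two highest-variance predictors,
# so here the top components happen to capture it.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
X[:, 0] *= 5.0
X[:, 1] *= 4.0
y = X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)
beta, intercept = pcr_fit(X, y, k=3)
```

When the relevant regression directions do not align with the leading principal components of X, this procedure can discard exactly the combinations that predict y, which is why a targeted selection using both X and y can be more efficient.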


Similar resources

Nonparametric Principal Components Regression

In ordinary least squares regression, dimensionality is a sensitive issue. As the number of independent variables approaches the sample size, the least squares algorithm can easily fail: estimates are not unique or are very unstable (Draper and Smith, 1981). There are several problems usually encountered in modeling high-dimensional data, including the difficulty of visualizing the data, s...
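The instability the blurb above describes is easy to demonstrate numerically: as the number of predictors p approaches the sample size n, a random design matrix becomes ill-conditioned, so least squares estimates become highly sensitive to noise. A small sketch (the function and parameter choices are illustrative, not from the paper):

```python
import numpy as np

def design_condition(n, p, seed=1):
    # Condition number of a random n x p Gaussian design matrix;
    # large values mean unstable least squares coefficient estimates.
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n, p))
    return np.linalg.cond(X)

# Conditioning deteriorates sharply as p approaches n = 50.
for p in (5, 45, 49):
    print(p, round(design_condition(50, p), 1))
```

Dimension-reduction methods such as principal components regression sidestep this by regressing on a small number k of component scores instead of all p predictors.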


Estimating Invariant Principal Components Using Diagonal Regression

In this work we apply the method of diagonal regression to derive an alternative version of Principal Component Analysis (PCA). “Diagonal regression” was introduced by Ragnar Frisch (the first economics Nobel laureate) in his paper “Correlation and Scatter in Statistical Variables” (1928). The benefits of using diagonal regression in PCA are that it provides components that are scale-invariant ...


Principal Components Regression With Data Chosen Components and Related Methods

Multiple regression with correlated predictor variables is relevant to a broad range of problems in the physical, chemical, and engineering sciences. Chemometricians, in particular, have made heavy use of principal components regression and related procedures for predicting a response variable from a large number of highly correlated predictors. In this paper we develop a general theory that gu...


Maximum likelihood principal components regression on wavelet-compressed data.

Maximum likelihood principal component regression (MLPCR) is an errors-in-variables method used to accommodate measurement error information when building multivariate calibration models. A hindrance of MLPCR has been the substantial demand on computational resources sometimes made by the algorithm, especially for certain types of error structures. Operations on these large matrices are memory ...


Nonlinear Regression Estimation Using Subset-based Kernel Principal Components

We study the estimation of conditional mean regression functions through the so-called subset-based kernel principal component analysis (KPCA). Instead of using one global kernel feature space, we project a target function into different localized kernel feature spaces at different parts of the sample space. Each localized kernel feature space reflects the relationship on a subset between the r...



Journal

Journal title: Journal of Multivariate Analysis

Year: 2022

ISSN: 0047-259X, 1095-7243

DOI: https://doi.org/10.1016/j.jmva.2022.104995